Added sigmoid-like activation functions #9011

Conversation

AIexanderDicke

Describe your change:

  • Add an algorithm?
  • Fix a bug or typo in an existing algorithm?
  • Documentation change?

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
  • All function parameters and return values are annotated with Python type hints.
  • All functions have doctests that pass the automated testing.
  • All new algorithms include at least one URL that points to Wikipedia or another similar explanation.
  • If this pull request resolves one or more open issues then the description above includes the issue number(s) with a closing keyword: "Fixes #ISSUE-NUMBER".

@rohan472000 (Contributor) left a comment:

Pass the parameters by keyword, for clarity and ease of interpretation across the functions.

>>> np.linalg.norm(np.array([0.5, 0.66666667, 0.83333333]) - result) < 10**(-5)
True
"""
return _base_activation(vector, 0, 1)

Suggested change:
-    return _base_activation(vector, 0, 1)
+    return _base_activation(vector, alpha=0, beta=1)

>>> np.linalg.norm(np.array([0, 0.66666667, 1.6]) - result) < 10**(-5)
True
"""
return _base_activation(vector, 1, beta)

Suggested change:
-    return _base_activation(vector, 1, beta)
+    return _base_activation(vector, alpha=1, beta=beta)

>>> np.linalg.norm(np.array([0, 0.7310585, 0.462098]) - result) < 10**(-5)
True
"""
return swish(vector, 1)

Suggested change:
-    return swish(vector, 1)
+    return swish(vector, beta=1)

@algorithms-keeper bot added the labels require tests (Tests [doctest/unittest/pytest] are required) and awaiting reviews (This PR is ready to be reviewed) on Sep 7, 2023
@algorithms-keeper bot left a comment:


🔗 Relevant Links

Repository:

Python:

Automated review generated by algorithms-keeper. If there's any problem regarding this review, please open an issue about it.

algorithms-keeper commands and options

algorithms-keeper actions can be triggered by commenting on this PR:

  • @algorithms-keeper review to trigger the checks for only added pull request files
  • @algorithms-keeper review-all to trigger the checks for all the pull request files, including the modified files. As we cannot post review comments on lines not part of the diff, this command will post all the messages in one comment.

NOTE: Commands are in beta, so this feature is restricted to members and owners of the organization.

import numpy as np


def _base_activation(vector: np.ndarray, alpha: float, beta: float) -> np.ndarray:


As there is no test file in this pull request, nor any test function or class in the file neural_network/activation_functions/sigmoid_like.py, please provide a doctest for the function _base_activation.

A contributor replied:

Turn the examples that you have kept in comments into doctests: remove the word "Example" and merge both statements (i.e., the result assignment and the np.linalg check), like this:

    >>> np.linalg.norm(np.array([0.5, 0.66666667, 0.83333333])
    ...     - _base_activation(np.array([0, np.log(2), np.log(5)]), 0, 1)) < 10**(-5)
    True

@algorithms-keeper bot removed the require tests (Tests [doctest/unittest/pytest] are required) label on Sep 7, 2023
@tianyizheng02 (Contributor) left a comment:

Personally, I think it's better to implement the three activation functions without using a shared helper function. While it might not be as elegant, I think it's better from an educational standpoint for users to be able to see the explicit formula for each of the functions.

Also, we already have the sigmoid and SiLU in the maths/ directory. However, I'd rather we have these functions in neural_network/activation_functions like you did, so we should delete these two existing files in favor of yours.
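
For illustration, a sketch of the standalone versions this comment seems to be asking for, using the function names that appear elsewhere in this thread (a hypothetical rewrite, not code from the PR):

    import numpy as np

    def sigmoid(vector: np.ndarray) -> np.ndarray:
        # Explicit formula: 1 / (1 + e^(-x))
        return 1 / (1 + np.exp(-vector))

    def swish(vector: np.ndarray, beta: float) -> np.ndarray:
        # Explicit formula: x * sigmoid(beta * x)
        return vector / (1 + np.exp(-beta * vector))

    def sigmoid_linear_unit(vector: np.ndarray) -> np.ndarray:
        # SiLU is swish with beta = 1, i.e. x * sigmoid(x)
        return vector / (1 + np.exp(-vector))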

@tianyizheng02 (Contributor) left a comment:

Just small improvements, but other than that LGTM

Comment on lines 12 to 13
>>> np.linalg.norm(np.array([0.5, 0.66666667, 0.83333333]) \
- sigmoid(vector=np.array([0, np.log(2), np.log(5)]))) < 10**(-5)

Suggested change:
-    >>> np.linalg.norm(np.array([0.5, 0.66666667, 0.83333333]) \
-    - sigmoid(vector=np.array([0, np.log(2), np.log(5)]))) < 10**(-5)
+    >>> np.linalg.norm(np.array([0.5, 0.66666667, 0.83333333])
+    ... - sigmoid(vector=np.array([0, np.log(2), np.log(5)]))) < 10**(-5)

I believe you can use ... to avoid using \

- sigmoid(vector=np.array([0, np.log(2), np.log(5)]))) < 10**(-5)
True
"""
return 1 / (1 + np.exp(-1 * vector))

Suggested change:
-    return 1 / (1 + np.exp(-1 * vector))
+    return 1 / (1 + np.exp(-vector))

Just slightly more concise

Comment on lines 28 to 32
>>> np.linalg.norm(np.array([0.5, 1., 1.5]) \
- swish(np.array([1, 2, 3]), 0)) < 10**(-5)
True
>>> np.linalg.norm(np.array([0, 0.66666667, 1.6]) \
- swish(np.array([0, 1, 2]), np.log(2))) < 10**(-5)

Suggested change:
-    >>> np.linalg.norm(np.array([0.5, 1., 1.5]) \
-    - swish(np.array([1, 2, 3]), 0)) < 10**(-5)
-    True
-    >>> np.linalg.norm(np.array([0, 0.66666667, 1.6]) \
-    - swish(np.array([0, 1, 2]), np.log(2))) < 10**(-5)
+    >>> np.linalg.norm(np.array([0.5, 1., 1.5])
+    ... - swish(np.array([1, 2, 3]), 0)) < 10**(-5)
+    True
+    >>> np.linalg.norm(np.array([0, 0.66666667, 1.6])
+    ... - swish(np.array([0, 1, 2]), np.log(2))) < 10**(-5)

Comment on lines 46 to 47
>>> np.linalg.norm(np.array([0, 0.7310585, 0.462098]) \
- sigmoid_linear_unit(np.array([0, 1, np.log(2)]))) < 10**(-5)

Suggested change:
-    >>> np.linalg.norm(np.array([0, 0.7310585, 0.462098]) \
-    - sigmoid_linear_unit(np.array([0, 1, np.log(2)]))) < 10**(-5)
+    >>> np.linalg.norm(np.array([0, 0.7310585, 0.462098])
+    ... - sigmoid_linear_unit(np.array([0, 1, np.log(2)]))) < 10**(-5)

- sigmoid_linear_unit(np.array([0, 1, np.log(2)]))) < 10**(-5)
True
"""
return vector / (1 + np.exp(-1 * vector))

Suggested change:
-    return vector / (1 + np.exp(-1 * vector))
+    return vector / (1 + np.exp(-vector))
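
To check these doctests locally, one can run the standard-library doctest module against the file (path taken from the algorithms-keeper comment above):

    python3 -m doctest neural_network/activation_functions/sigmoid_like.py -v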

@algorithms-keeper bot added the tests are failing (Do not merge until tests pass) label on Sep 17, 2023
@rohan472000 (Contributor) commented:

#9078 has been merged, so this build should pass now; please run it again.

@cclauss closed this on Sep 30, 2023.
Labels: awaiting reviews (This PR is ready to be reviewed), tests are failing (Do not merge until tests pass)